99 research outputs found

    Automatic Calibration and Equalization of a Line Array System

    This final project presents an automated public address (PA) processing unit based on delay and magnitude frequency-response adjustment. The aim is to achieve a flat frequency response and correct delay alignment between physically separated speakers at the measuring point, an adjustment that is nowadays usually made manually by the sound technician. The adjustment is obtained by applying four signal-processing operations to the audio signal: time-delay adjustment, crossover filtering, gain adjustment, and graphic equalization. The automation lies in the calculation of the parameter sets: estimation of the time delay, selection of a suitable crossover frequency, and calculation of the gains for a third-octave graphic equalizer. These automatic methods reduce the time and effort needed to calibrate line-array PA systems, since only three sine sweeps must be played through the sound system. To verify the functioning of the system, both simulated signals and measurements have been used. A 1:10 scale model of a line-array system has been designed and constructed, the automatic calibration and equalization methods have been tested on it in an anechoic chamber, and the results are analyzed.
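    As a concrete illustration of one of these automatic steps, the sketch below estimates the time delay between two recorded sine sweeps from the peak of their cross-correlation. This is a minimal sketch of a standard approach, assuming SciPy is available; the function names, sampling rate, and signals are placeholders, and the project's actual estimator may differ.

    ```python
    import numpy as np
    from scipy.signal import correlate, correlation_lags

    def estimate_delay(reference, measured, fs):
        """Estimate how much `measured` lags `reference` (in seconds) from
        the peak of their cross-correlation, e.g. two recorded sweeps."""
        xcorr = correlate(measured, reference, mode="full")
        lags = correlation_lags(len(measured), len(reference), mode="full")
        lag = lags[np.argmax(np.abs(xcorr))]  # sample offset of strongest peak
        return lag / fs

    # Hypothetical usage with two sweep recordings captured at the
    # measuring point (the array names and fs are placeholders):
    # delay_s = estimate_delay(sweep_main_array, sweep_delay_tower, fs=48_000)
    ```

    The estimated delay would then be applied to the earlier-arriving speaker group so that both wavefronts align at the measuring point.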

    Can One Trust Quantum Simulators?

    Various fundamental phenomena of strongly correlated quantum systems, such as high-Tc superconductivity, the fractional quantum Hall effect, and quark confinement, are still awaiting a universally accepted explanation. The main obstacle is the computational complexity of solving even the most simplified theoretical models designed to capture the relevant quantum correlations of the many-body system of interest. In his seminal 1982 paper [Int. J. Theor. Phys. 21, 467], Richard Feynman suggested that such models might be solved by "simulation" with a new type of computer whose constituent parts are effectively governed by a desired quantum many-body dynamics. Measurements on this engineered machine, now known as a "quantum simulator," would reveal some unknown or difficult-to-compute properties of a model of interest. We argue that a useful quantum simulator must satisfy four conditions: relevance, controllability, reliability, and efficiency. We review the current state of the art of digital and analog quantum simulators. Whereas so far the majority of the focus, both theoretical and experimental, has been on the controllability of relevant models, we emphasize here the need for a careful analysis of reliability and efficiency in the presence of imperfections. We discuss how disorder and noise can impact these conditions, and illustrate our concerns with novel numerical simulations of a paradigmatic example: a disordered quantum spin chain governed by the Ising model in a transverse magnetic field. We find that disorder can decrease the reliability of an analog quantum simulator of this model, although large errors in local observables are introduced only for strong levels of disorder. We conclude that the answer to the question "Can we trust quantum simulators?" is... to some extent.
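    As a self-contained illustration of the paradigmatic example (not the authors' actual code or parameters), the following sketch exactly diagonalizes a small disordered transverse-field Ising chain and evaluates a local observable in the ground state; the chain length, disorder model, and observable are illustrative assumptions.

    ```python
    import numpy as np

    # Pauli matrices and single-site identity
    sx = np.array([[0., 1.], [1., 0.]])
    sz = np.array([[1., 0.], [0., -1.]])
    id2 = np.eye(2)

    def op_at(op, site, n):
        """Embed a single-site operator at `site` in an n-spin chain."""
        out = np.array([[1.]])
        for i in range(n):
            out = np.kron(out, op if i == site else id2)
        return out

    def tfim_hamiltonian(J, h):
        """H = -sum_i J_i sz_i sz_{i+1} - sum_i h_i sx_i (open chain)."""
        n = len(h)
        H = np.zeros((2**n, 2**n))
        for i in range(n - 1):
            H -= J[i] * op_at(sz, i, n) @ op_at(sz, i + 1, n)
        for i in range(n):
            H -= h[i] * op_at(sx, i, n)
        return H

    rng = np.random.default_rng(0)
    n, W = 8, 0.5                            # chain length, disorder strength
    J = 1.0 + W * rng.uniform(-1, 1, n - 1)  # disordered couplings
    h = np.ones(n)                           # uniform transverse field
    energies, states = np.linalg.eigh(tfim_hamiltonian(J, h))
    gs = states[:, 0]                        # ground state
    m0 = gs @ op_at(sz, 0, n) @ gs           # local observable <sz_0>
    ```

    Comparing such exactly computed local observables against the readout of a noisy analog device implementing the same Hamiltonian is one way to quantify the reliability the authors discuss.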

    White Paper on Digital and Complex Information

    Information is one of the main traits of the contemporary era. Indeed, there are many perspectives from which to define the present times, such as the Digital Age, the Big Data era, the Fourth Industrial Revolution, and the fourth paradigm of science, and in all of them information, as it is gathered, stored, processed, and transmitted, plays a key role. Technological developments in the last decades, such as powerful computers, cheaper and miniaturized solutions like smartphones, massive optical communication, and the Internet, to name a few, have enabled this shift to the Information Age. This shift has driven deep cultural and social changes in daily life: in work and personal activities, in access to knowledge and the spreading of information, in interpersonal relations and the way we interact in the public and private spheres, and in economy and politics, paving the way to globalization.

    Measurement of ISR-FSR interference in the processes e+ e- --> mu+ mu- gamma and e+ e- --> pi+ pi- gamma

    Charge asymmetry in the processes e+ e- --> mu+ mu- gamma and e+ e- --> pi+ pi- gamma is measured using 232 fb^-1 of data collected with the BABAR detector at center-of-mass energies near 10.58 GeV. An observable is introduced and shown to be very robust against detector asymmetries while retaining a large sensitivity to the physical charge asymmetry that results from the interference between initial- and final-state radiation. The asymmetry is determined as a function of the invariant mass of the final-state tracks, from production threshold to a few GeV/c^2. It is compared to the expectation from QED for e+ e- --> mu+ mu- gamma and from theoretical models for e+ e- --> pi+ pi- gamma. A clear interference pattern is observed in e+ e- --> pi+ pi- gamma, particularly in the vicinity of the f_2(1270) resonance. The inferred rate of lowest-order FSR production is consistent with the QED expectation for e+ e- --> mu+ mu- gamma, and is negligibly small for e+ e- --> pi+ pi- gamma.
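    The abstract does not reproduce the definition of the detector-robust observable, so the sketch below shows only the generic starting point of such analyses: a binned forward-backward charge asymmetry A = (N_F - N_B)/(N_F + N_B) as a function of invariant mass, with all names and inputs hypothetical.

    ```python
    import numpy as np

    def charge_asymmetry(mass, cos_theta_pos, bins):
        """Binned charge asymmetry: an event is 'forward' when the positive
        track has cos(theta) > 0 in the chosen reference frame."""
        forward = cos_theta_pos > 0
        n_f, _ = np.histogram(mass[forward], bins=bins)
        n_b, _ = np.histogram(mass[~forward], bins=bins)
        total = np.maximum(n_f + n_b, 1)                   # avoid division by zero
        asym = (n_f - n_b) / total
        err = np.sqrt(np.maximum(1 - asym**2, 0) / total)  # binomial uncertainty
        return asym, err
    ```

    The paper's observable is constructed so that detector-induced asymmetries cancel; this naive estimator does not have that property.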

    MICADO PSF-reconstruction work package description

    The point spread function reconstruction (PSF-R) capability is a deliverable of the MICADO@ESO-ELT project. The PSF-R team works on the implementation of the instrument software devoted to reconstructing the point spread function (PSF), independently of the science data, using adaptive optics (AO) telemetry data, for both the Single Conjugate (SCAO) and Multi-Conjugate Adaptive Optics (MCAO) modes of the MICADO camera and spectrograph. The PSF-R application will provide reconstructed PSFs through an archive querying system that retrieves the telemetry data synchronous with each science frame that MICADO generates. Eventually, the PSF-R software will produce its output according to user specifications. The PSF-R service will support state-of-the-art scientific analysis of MICADO imaging and spectroscopic data.

    Pervasive gaps in Amazonian ecological research

    Biodiversity loss is one of the main challenges of our time,1,2 and attempts to address it require a clear understanding of how ecological communities respond to environmental change across time and space.3,4 While the increasing availability of global databases on ecological communities has advanced our knowledge of biodiversity sensitivity to environmental changes,5–7 vast areas of the tropics remain understudied.8–11 In the American tropics, Amazonia stands out as the world's most diverse rainforest and the primary source of Neotropical biodiversity,12 but it remains among the least known forests in America and is often underrepresented in biodiversity databases.13–15 To worsen this situation, human-induced modifications16,17 may eliminate pieces of the Amazon's biodiversity puzzle before we can use them to understand how ecological communities are responding. To increase the generalization and applicability of biodiversity knowledge,18,19 it is thus crucial to reduce biases in ecological research, particularly in regions projected to face the most pronounced environmental changes. We integrate ecological community metadata of 7,694 sampling sites for multiple organism groups in a machine learning model framework to map the research probability across Brazilian Amazonia, while identifying the region's vulnerability to environmental change. 15%–18% of the most neglected areas in ecological research are expected to experience severe climate or land use changes by 2050. This means that unless we take immediate action, we will not be able to establish their current status, much less monitor how it is changing and what is being lost.
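    The mapping step can be pictured with a minimal sketch, assuming a gridded feature table of environmental and accessibility covariates with a binary label for "contains a sampling site"; the covariates, model family, and data below are stand-ins, not the authors' actual framework.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(42)
    n_cells, n_features = 5_000, 6
    X = rng.normal(size=(n_cells, n_features))   # placeholder covariates per grid cell
    y = (rng.random(n_cells) < 0.1).astype(int)  # placeholder "sampled" labels

    model = RandomForestClassifier(n_estimators=200, random_state=0)
    model.fit(X, y)
    research_probability = model.predict_proba(X)[:, 1]  # one value per cell, mappable
    ```

    Cells with low predicted research probability but high projected climate or land-use change would correspond to the neglected, vulnerable areas the paper highlights.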

    National trends in total cholesterol obscure heterogeneous changes in HDL and non-HDL cholesterol and total-to-HDL cholesterol ratio : a pooled analysis of 458 population-based studies in Asian and Western countries

    Background: Although high-density lipoprotein (HDL) and non-HDL cholesterol have opposite associations with coronary heart disease, multi-country reports of lipid trends use only total cholesterol (TC). Our aim was to compare trends in total, HDL and non-HDL cholesterol and the total-to-HDL cholesterol ratio in Asian and Western countries. Methods: We pooled 458 population-based studies with 82.1 million participants in 23 Asian and Western countries. We estimated changes in mean total, HDL and non-HDL cholesterol and mean total-to-HDL cholesterol ratio by country, sex and age group. Results: Since ~1980, mean TC has increased in Asian countries. In Japan and South Korea, the TC rise was due to rising HDL cholesterol, which increased by up to 0.17 mmol/L per decade in Japanese women; in China, it was due to rising non-HDL cholesterol. TC declined in Western countries, except in Polish men. The decline was largest in Finland and Norway, at ~0.4 mmol/L per decade. The decline in TC in most Western countries was the net effect of an increase in HDL cholesterol and a decline in non-HDL cholesterol, with the HDL cholesterol increase largest in New Zealand and Switzerland. The mean total-to-HDL cholesterol ratio declined in Japan, South Korea and most Western countries, by as much as ~0.7 per decade in Swiss men (equivalent to a ~26% decline in coronary heart disease risk per decade). The ratio increased in China. Conclusions: HDL cholesterol has risen and the total-to-HDL cholesterol ratio has declined in many Western countries, Japan and South Korea, with only a weak correlation with changes in TC or non-HDL cholesterol.
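    The derived quantities behind these trends follow from two simple identities: non-HDL cholesterol = TC - HDL, and the ratio = TC / HDL. The toy numbers below (not study data) show how an unchanged TC can hide an improving lipid profile when HDL rises:

    ```python
    def lipid_indices(tc, hdl):
        """Derived lipid measures from total and HDL cholesterol (mmol/L)."""
        return tc - hdl, tc / hdl  # (non-HDL cholesterol, total-to-HDL ratio)

    # Two hypothetical surveys a decade apart with identical TC:
    print(lipid_indices(5.2, 1.2))  # (4.0, ~4.33)
    print(lipid_indices(5.2, 1.5))  # (3.7, ~3.47): non-HDL and ratio both improved
    ```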

    Repositioning of the global epicentre of non-optimal cholesterol

    High blood cholesterol is typically considered a feature of wealthy western countries (1,2). However, dietary and behavioural determinants of blood cholesterol are changing rapidly throughout the world (3), and countries are using lipid-lowering medications at varying rates. These changes can have distinct effects on the levels of high-density lipoprotein (HDL) cholesterol and non-HDL cholesterol, which have different effects on human health (4,5). However, trends in HDL and non-HDL cholesterol levels over time have not previously been reported in a global analysis. Here we pooled 1,127 population-based studies that measured blood lipids in 102.6 million individuals aged 18 years and older to estimate trends from 1980 to 2018 in mean total, non-HDL and HDL cholesterol levels for 200 countries. Globally, there was little change in total or non-HDL cholesterol from 1980 to 2018. This was the net effect of increases in low- and middle-income countries, especially in east and southeast Asia, and decreases in high-income western countries, especially those in northwestern Europe, and in central and eastern Europe. As a result, the countries with the highest levels of non-HDL cholesterol, which is a marker of cardiovascular risk, changed from those in western Europe such as Belgium, Finland, Greenland, Iceland, Norway, Sweden, Switzerland and Malta in 1980 to those in Asia and the Pacific, such as Tokelau, Malaysia, the Philippines and Thailand. In 2017, high non-HDL cholesterol was responsible for an estimated 3.9 million (95% credible interval 3.7 million–4.2 million) worldwide deaths, half of which occurred in east, southeast and south Asia. The global repositioning of lipid-related risk, with non-optimal cholesterol shifting from a distinct feature of high-income countries in northwestern Europe, north America and Australasia to one that affects countries in east and southeast Asia and Oceania, should motivate the use of population-based policies and personal interventions to improve nutrition and enhance access to treatment throughout the world.